22 research outputs found

    Static Output-Feedback Control of Markov Jump Linear Systems Without Mode Observation


    A Distance-based Framework for Gaussian Processes over Probability Distributions

    Gaussian processes constitute a powerful and well-understood method for non-parametric regression and classification. In the classical framework, the training data consist of deterministic vector-valued inputs and the corresponding (noisy) measurements, whose joint distribution is assumed to be Gaussian. In many practical applications, however, the inputs are either noisy, i.e., each input is a vector-valued sample from an unknown probability distribution, or the probability distributions themselves are the inputs. In this paper, we address Gaussian process regression with inputs given in the form of probability distributions and propose a framework based on distances between such inputs. To this end, we review different admissible distance measures and provide a numerical example that demonstrates our framework.
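
    As a hedged illustration of the distance-based idea only (not the construction from the paper), the sketch below plugs the 1-D Wasserstein distance between empirical input samples into a squared-exponential kernel and runs plain GP regression with it. The kernel form, the length scale ell, the noise level, and the use of scipy.stats.wasserstein_distance are assumptions made for this example; distance-substitution kernels of this kind are not guaranteed to be positive semi-definite for every distance, which is one reason a review of admissible distance measures matters.

```python
# Hedged sketch: GP regression where each input is an empirical probability
# distribution (a 1-D sample), using a distance-substitution kernel.
# NOTE: kernel form, hyperparameters, and distance choice are illustrative
# assumptions, not the construction from the paper.
import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(0)

# Training inputs: each "input" is a sample drawn from a Gaussian whose mean
# varies; the regression target depends on that (unknown) mean.
means = np.linspace(-2.0, 2.0, 8)
X_train = [rng.normal(loc=m, scale=0.5, size=200) for m in means]
y_train = np.sin(means)

def dist_matrix(A, B):
    """Pairwise 1-D Wasserstein distances between empirical samples."""
    return np.array([[wasserstein_distance(a, b) for b in B] for a in A])

def kernel(A, B, ell=1.0):
    """Squared-exponential kernel applied to the Wasserstein distance.

    Caution: such distance-substitution kernels are not PSD for arbitrary
    distances; in practice one would check or regularize."""
    D = dist_matrix(A, B)
    return np.exp(-0.5 * (D / ell) ** 2)

noise = 1e-2
K = kernel(X_train, X_train) + noise * np.eye(len(X_train))
alpha = np.linalg.solve(K, y_train)

# Predict for a new distributional input (a sample with mean 0.7).
X_test = [rng.normal(loc=0.7, scale=0.5, size=200)]
y_pred = kernel(X_test, X_train) @ alpha
print("prediction:", y_pred, "ground truth:", np.sin(0.7))
```

    A maximum mean discrepancy or an energy distance could be swapped into dist_matrix without changing the rest of the sketch.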

    Neutrino spin relaxation in medium with stochastic characteristics

    The helicity evolution of a neutrino interacting with randomly moving and polarized matter is studied. We derive the equation for the averaged neutrino helicity. The type of the neutrino interaction with the background fermions is left unspecified. In the particular case of a tau neutrino interacting with an ultrarelativistic electron-positron plasma, we obtain an explicit expression for the neutrino helicity relaxation rate. We study neutrino spin relaxation in the relativistic primordial plasma. Assuming that the conversion of left-handed neutrinos into right-handed ones is suppressed at the early stages of the evolution of the Universe, we obtain an upper limit on the tau-neutrino mass.
    Comment: 6 pages, RevTeX4; 2 references added; a more detailed discussion of correlation functions and cosmological neutrinos is presented; version to be published in Int. J. Mod. Phys.
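
    As a purely schematic illustration (an assumption made for this listing, not the equation derived in the paper), averaging the helicity over the stochastic properties of the background typically yields a relaxation-type law, with the rate set by the correlation functions of the random matter velocity and polarization:

```latex
% Schematic form only; \Gamma stands for a helicity relaxation rate fixed by
% the correlators of the stochastic background, not for the paper's result.
\frac{d\langle h \rangle}{dt} = -\Gamma\,\langle h \rangle
\qquad\Longrightarrow\qquad
\langle h(t) \rangle = \langle h(0) \rangle\, e^{-\Gamma t}
```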

    Approximate Stochastic Optimal Control of Smooth Nonlinear Systems and Piecewise Linear Systems


    Unstable Relics as a Source of Galactic Positrons

    We calculate the fluxes of 511 keV photons from the Galactic bulge caused by positrons produced in the decays of relic particles with masses below 100 MeV. In particular, we tighten the constraints on sterile neutrinos over a large domain of the mass--mixing-angle parameter space, where the resulting photon flux would significantly exceed the experimental data. At the same time, the observed photon fluxes can easily be caused by decaying sterile neutrinos in the mass range 1 MeV < m_sterile < 50 MeV with a cosmological abundance typically within 10^{-9} < Omega_sterile < 10^{-5}, assuming that Omega_sterile comes entirely from the conversion of active neutrinos in the early Universe. Other candidates for decaying relics, such as neutral (pseudo)scalar particles coupled to leptons with gravitational strength, can be compatible with the photon flux and can constitute the main component of cold dark matter.
    Comment: LaTeX, 14 pages, 3 figures; the calculation of the cosmological background is included.
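
    As a hedged back-of-the-envelope scaling (an assumption for illustration, not the flux calculation performed in the paper), the positron injection from decaying relics integrates the decay rate per unit volume along the line of sight through the bulge; the 511 keV photon flux then follows after folding in the annihilation and positronium physics:

```latex
% Schematic scaling only; n_X is the relic number density, tau_X its lifetime,
% and the integral runs over the line of sight (l.o.s.) through the bulge.
\Phi_{e^+} \sim \frac{1}{4\pi} \int_{\mathrm{l.o.s.}} \frac{n_X(\ell)}{\tau_X}\, d\ell
```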

    Dynamic compensation of Markov jump linear systems without mode observation


    Bridging the Gap Between Multi-Step and One-Shot Trajectory Prediction via Self-Supervision

    Accurate vehicle trajectory prediction is an unsolved problem in autonomous driving with various open research questions. State-of-the-art approaches regress trajectories either in a one-shot or a step-wise manner. Although one-shot approaches are usually preferred for their simplicity, they relinquish powerful self-supervision schemes that can be constructed by chaining multiple time steps. We address this issue by proposing a middle ground where multiple trajectory segments are chained together. Our proposed Multi-Branch Self-Supervised Predictor receives additional training on new predictions starting at intermediate future segments. In addition, the model 'imagines' the latent context and 'predicts the past' while combining multi-modal trajectories in a tree-like manner. We deliberately keep aspects such as interaction and environment modeling simplistic and nevertheless achieve competitive results on the INTERACTION dataset. Furthermore, we investigate the sparsely explored uncertainty estimation of deterministic predictors. We find positive correlations between the prediction error and two proposed metrics, which might pave the way for determining prediction confidence.
    Comment: 8 pages, 6 figures, to be published in the 34th IEEE Intelligent Vehicles Symposium (IV).
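
    As a hedged conceptual sketch of segment chaining for self-supervision (not the Multi-Branch Self-Supervised Predictor itself), the snippet below trains a small GRU predictor that maps an observed segment to the next segment, applies it again to its own output, and supervises that chained branch with the later ground-truth segment. All module names, sizes, and the loss weighting are assumptions for the example.

```python
# Hedged sketch of chaining trajectory segments for self-supervision.
# Names, architecture, and loss weights are illustrative assumptions,
# not the Multi-Branch Self-Supervised Predictor from the paper.
import torch
import torch.nn as nn

class SegmentPredictor(nn.Module):
    """Maps an observed (x, y) segment to the next segment of equal length."""
    def __init__(self, seg_len=10, hidden=64):
        super().__init__()
        self.seg_len = seg_len
        self.encoder = nn.GRU(input_size=2, hidden_size=hidden, batch_first=True)
        self.decoder = nn.Linear(hidden, seg_len * 2)

    def forward(self, segment):               # segment: (B, seg_len, 2)
        _, h = self.encoder(segment)          # h: (1, B, hidden)
        out = self.decoder(h.squeeze(0))      # (B, seg_len * 2)
        return out.view(-1, self.seg_len, 2)  # (B, seg_len, 2)

model = SegmentPredictor()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# Toy batch: an observed history segment and two future ground-truth segments.
B, L = 32, 10
history = torch.randn(B, L, 2)
future_1 = torch.randn(B, L, 2)   # ground truth for the first predicted segment
future_2 = torch.randn(B, L, 2)   # ground truth for the chained second segment

pred_1 = model(history)           # one-step branch, supervised directly
pred_2 = model(pred_1)            # chained branch: restart from own prediction

loss = nn.functional.mse_loss(pred_1, future_1) \
     + 0.5 * nn.functional.mse_loss(pred_2, future_2)  # assumed weighting
optimizer.zero_grad()
loss.backward()
optimizer.step()
```

    The chained branch only sees the model's own prediction as input, which is the kind of self-supervision that pure one-shot regression gives up.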

    Unscented Autoencoder

    The Variational Autoencoder (VAE) is a seminal approach in deep generative modeling with latent variables. Interpreting its reconstruction process as a nonlinear transformation of samples from the latent posterior distribution, we apply the Unscented Transform (UT) -- a well-known distribution approximation used in the Unscented Kalman Filter (UKF) from the field of filtering. A finite set of statistics called sigma points, sampled deterministically, provides a more informative and lower-variance posterior representation than the ubiquitous noise-scaling of the reparameterization trick, while ensuring higher-quality reconstruction. We further boost the performance by replacing the Kullback-Leibler (KL) divergence with the Wasserstein distribution metric, which allows for a sharper posterior. Inspired by these two components, we derive a novel, deterministic-sampling flavor of the VAE, the Unscented Autoencoder (UAE), trained purely with regularization-like terms on the per-sample posterior. We empirically show competitive performance in Fréchet Inception Distance (FID) scores over closely related models, in addition to a lower training variance than the VAE.
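
    As a hedged sketch of the deterministic-sampling idea (an illustrative reduction, not the UAE training objective), the snippet below decodes the 2n+1 sigma points of a diagonal-Gaussian posterior instead of drawing a single noisy sample via the reparameterization trick. The tiny encoder/decoder, the scaling parameter lam, and the averaging of reconstruction errors are assumptions, and the posterior regularizer (e.g., a Wasserstein-based term) is omitted here.

```python
# Hedged sketch: decoding Unscented-Transform sigma points of a diagonal
# Gaussian posterior instead of a single reparameterized sample.
# Encoder/decoder sizes, lam, and the loss averaging are illustrative
# assumptions, not the Unscented Autoencoder as published.
import torch
import torch.nn as nn

latent_dim, data_dim, lam = 8, 784, 1.0

encoder = nn.Linear(data_dim, 2 * latent_dim)   # outputs mean and log-variance
decoder = nn.Linear(latent_dim, data_dim)

def sigma_points(mu, logvar, lam=1.0):
    """Return the 2n+1 sigma points of N(mu, diag(exp(logvar))) per sample.

    mu, logvar: (B, n)  ->  (B, 2n+1, n)
    """
    n = mu.size(-1)
    scale = torch.sqrt((n + lam) * logvar.exp())        # (B, n)
    offsets = torch.diag_embed(scale)                   # (B, n, n)
    return torch.cat([mu.unsqueeze(1),                  # center point
                      mu.unsqueeze(1) + offsets,        # +sqrt((n+lam)*var_i) e_i
                      mu.unsqueeze(1) - offsets], dim=1)

x = torch.randn(16, data_dim)                           # toy batch
mu, logvar = encoder(x).chunk(2, dim=-1)
z = sigma_points(mu, logvar, lam)                       # (B, 2n+1, n)
recon = decoder(z)                                      # decode every sigma point
# Average the reconstruction error over all sigma points of each input.
recon_loss = ((recon - x.unsqueeze(1)) ** 2).mean()
```

    Because the sigma points are a deterministic function of the posterior mean and variance, the reconstruction term has no sampling noise, which is the source of the lower training variance mentioned in the abstract.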